Human-Exoskeleton Kinematic Calibration to Improve Hand Tracking for Dexterous Teleoperation

Zhang, Haiyun, Gasperina, Stefano Dalla, Yousaf, Saad N., Tsuboi, Toshimitsu, Narita, Tetsuya, Deshpande, Ashish D.

arXiv.org Artificial Intelligence

Hand exoskeletons are critical tools for dexterous teleoperation and immersive manipulation interfaces, but achieving accurate hand tracking remains a challenge due to user-specific anatomical variability and donning inconsistencies. These issues lead to kinematic misalignments that degrade tracking performance and limit applicability in precision tasks. We propose a subject-specific calibration framework for exoskeleton-based hand tracking that estimates virtual link parameters through residual-weighted optimization. A data-driven approach is introduced to empirically tune cost function weights using motion capture ground truth, enabling accurate and consistent calibration across users. Implemented on the Maestro hand exoskeleton with seven healthy participants, the method achieved substantial reductions in joint and fingertip tracking errors across diverse hand geometries. Qualitative visualizations using a Unity-based virtual hand further demonstrate improved motion fidelity. The proposed framework generalizes to exoskeletons with closed-loop kinematics and minimal sensing, laying the foundation for high-fidelity teleoperation and robot learning applications.
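The residual-weighted calibration idea in the abstract can be illustrated with a toy example: predict fingertip positions from candidate virtual-link lengths, score them against motion-capture ground truth with a weighted sum of squared residuals, and pick the parameters that minimize the cost. The planar two-link finger, the link-length ranges, and all function names below are illustrative assumptions, not the paper's actual kinematic model or optimizer.

```python
import math

# Hypothetical two-link planar finger: forward kinematics to the fingertip.
def fingertip(l1, l2, q1, q2):
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y

# Ground-truth ("mocap") fingertip data generated with the true link lengths.
TRUE_L1, TRUE_L2 = 0.045, 0.030      # metres, illustrative values only
poses = [(0.2, 0.4), (0.6, 0.3), (1.0, 0.5), (0.4, 0.9)]  # joint angles (rad)
mocap = [fingertip(TRUE_L1, TRUE_L2, q1, q2) for q1, q2 in poses]

def weighted_cost(l1, l2, w=1.0):
    """Residual-weighted sum of squared fingertip tracking errors."""
    c = 0.0
    for (q1, q2), (mx, my) in zip(poses, mocap):
        px, py = fingertip(l1, l2, q1, q2)
        c += w * ((px - mx) ** 2 + (py - my) ** 2)
    return c

# Coarse grid search over candidate virtual-link lengths (1 mm steps).
best = min(
    ((l1 / 1000.0, l2 / 1000.0)
     for l1 in range(30, 61) for l2 in range(20, 41)),
    key=lambda p: weighted_cost(*p),
)
print(best)  # recovers the true lengths on this noiseless toy data
```

In the paper, the residual weights themselves are tuned empirically from mocap data; here `w` is a single fixed scalar, and a real implementation would use a gradient-based least-squares solver rather than a grid search.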


I have been performing the ancient practice of palm reading for more than 40 years... here are my secrets so YOU can tell your own future

Daily Mail - Science & tech

Imagine looking down at your hands and knowing what the future holds. For centuries, people have professed to have the ability to read palms, analyzing lines, hand shape, finger length, texture and other details to predict what's coming. This pseudoscience of fortune telling, known as palmistry or chiromancy, claims to interpret lines and creases on the hand to predict aspects of your life, including love, health, creativity, intelligence and emotions. Followers of the ancient practice believe every mark and groove has a meaning that can differ based on length, curvature, and depth. Even the intersections of lines supposedly add up to a detailed analysis of a person's character and potential future outcomes.


Prompt-Propose-Verify: A Reliable Hand-Object-Interaction Data Generation Framework using Foundational Models

Juneja, Gurusha, Kumar, Sukrit

arXiv.org Artificial Intelligence

Diffusion models, when conditioned on text prompts, generate realistic-looking images with intricate details, but most of these pre-trained models fail to generate accurate images of human features like hands and teeth. We hypothesize that this shortcoming of diffusion models can be overcome with well-annotated, good-quality data. In this paper, we focus on improving hand-object-interaction image generation using diffusion models. We collect a well-annotated synthetic hand-object-interaction dataset curated using the Prompt-Propose-Verify framework and finetune a Stable Diffusion model on it. We evaluate the image-text dataset on qualitative and quantitative metrics such as CLIPScore, ImageReward, Fidelity, and alignment, and show considerably better performance than the current state-of-the-art benchmarks.


A New Tool Helps Artists Thwart AI--With a Middle Finger

WIRED

When artificial intelligence image generators first rolled out, they seemed like magic. Churning out detailed imagery in minutes was, from one angle, a technical marvel. From another angle, though, it looked like mere mimicry. The models were trained on billions of images without anyone asking the humans behind them for permission. "They have sucked the creative juices of millions of artists," says Eva Toorenent, an illustrator who serves as the Netherlands adviser for the European Guild for Artificial Intelligence Regulation.


OpenAI Demonstrates Complex Manipulation Transfer from Simulation to Real World

IEEE Spectrum Robotics

In-hand manipulation is one of those things that's fairly high on the list of "skills that are effortless for humans but extraordinarily difficult for robots." Without even really thinking about it, we're able to adaptively coordinate four fingers and a thumb with our palm and friction and gravity to move things around in one hand without using our other hand--you've probably done this a handful (heh) of times today already, just with your cellphone. It takes us humans years of practice to figure out how to do in-hand manipulation robustly, but robots don't have that kind of time. Learning through practice and experience is still the way to go for complex tasks like this, and the challenge is finding a way to learn faster and more efficiently than just giving a robot hand something to manipulate over and over until it learns what works and what doesn't, which would probably take about a hundred years. Rather than wait a hundred years, researchers at OpenAI have used reinforcement learning to train a convolutional neural network to control a five-fingered Shadow hand to manipulate objects, all in just 50 hours.


Uber's Robo-Car Test in SF Is a Middle Finger to Regulators

WIRED

Uber's self-driving cars are now picking up riders in San Francisco--even as regulators say they aren't allowed to. Unlike Pennsylvania, where in September Uber launched its first pilot program, the state of California requires that companies testing autonomous tech apply for a permit with the Department of Motor Vehicles, have insurance for the technology, and publicly report data like crashes and "disengagements"--when the human operator takes back control to make sure the car operates safely. Uber has been testing its autonomous cars--a few dozen retrofitted Volvo XC90 SUVs--for weeks in San Francisco, without a permit and without following those rules. "We didn't get a permit in California because we don't believe we need one," says Shari Doherty, a spokesperson for Uber. In a blog post, the company's autonomous tech chief, Anthony Levandowski, asserted that the rules only apply to cars that can drive themselves without a human supervisor, while Uber's cars have human engineers at the wheel, ready to take over if necessary.